On the Power of Truncated SVD for General High-rank Matrix Estimation Problems

Authors

  • Simon S. Du
  • Yining Wang
  • Aarti Singh
Abstract

We show that given an estimate Â that is close to a general high-rank positive semidefinite (PSD) matrix A in spectral norm (i.e., ‖Â−A‖2 ≤ δ), the simple truncated Singular Value Decomposition of Â produces a multiplicative approximation of A in Frobenius norm. This observation leads to many interesting results on general high-rank matrix estimation problems:

1. High-rank matrix completion: we show that it is possible to recover a general high-rank matrix A up to (1 + ε) relative error in Frobenius norm from partial observations, with sample complexity independent of the spectral gap of A.

2. High-rank matrix denoising: we design an algorithm that recovers a matrix A with error in Frobenius norm from its noise-perturbed observations, without assuming A is exactly low-rank.

3. Low-dimensional approximation of high-dimensional covariance: given N i.i.d. samples of dimension n from Nn(0, A), we show that it is possible to approximate the covariance matrix A with relative error in Frobenius norm with N ≈ n, improving over classical covariance estimation results which require N ≈ n.
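The core step in the abstract, taking the truncated SVD of a spectrally close estimate Â of a high-rank PSD matrix, can be sketched as follows. This is a minimal illustration with assumed dimensions, spectrum, noise level, and rank cutoff, not the authors' full algorithm:

```python
import numpy as np

def truncated_svd(A_hat, k):
    """Best rank-k approximation of a symmetric A_hat via its SVD."""
    U, s, Vt = np.linalg.svd(A_hat, hermitian=True)
    return U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

# Build a high-rank PSD matrix A with a slowly decaying spectrum (assumed
# for illustration) and a spectrally perturbed estimate A_hat = A + E.
rng = np.random.default_rng(0)
n = 50
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
eigs = np.linspace(1.0, 0.01, n)      # all eigenvalues nonzero: A is full rank
A = Q @ np.diag(eigs) @ Q.T
E = rng.standard_normal((n, n))
E = 0.005 * (E + E.T) / 2             # small symmetric perturbation, so
A_hat = A + E                         # the spectral error delta is small

A_k = truncated_svd(A_hat, k=10)      # rank-10 approximation of full-rank A
err = np.linalg.norm(A_k - A, "fro") / np.linalg.norm(A, "fro")
```

With a slowly decaying spectrum the relative Frobenius error of the rank-10 truncation is dominated by the discarded tail of A, not by the perturbation, which is the regime the paper's multiplicative guarantee addresses.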


Similar articles

Truncated Linear Minimax Estimator of a Power of the Scale Parameter in a Lower-Bounded Parameter Space

Minimax estimation problems with restricted parameter space have received increasing interest within the last two decades. Some authors derived minimax and admissible estimators of bounded parameters under squared error loss and scale-invariant squared error loss. In some truncated estimation problems, the most natural estimator to be considered is the truncated version of a classic...


Faster SVD-Truncated Least-Squares Regression

We develop a fast algorithm for computing the "SVD-truncated" regularized solution to the least-squares problem min_x ‖Ax − b‖2. Let A_k of rank k be the best rank-k matrix computed via the SVD of A. Then, the SVD-truncated regularized solution is x_k = A_k^† b. If A is m × n, then it takes O(mn·min{m, n}) time to compute x_k using the SVD of A. We give an approximation algorithm for x_k which const...
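The baseline solution x_k = A_k^† b described above can be computed directly from the top-k singular triplets of A. This is a plain SVD-based sketch of that baseline with assumed matrix sizes, not the paper's faster approximation algorithm:

```python
import numpy as np

def svd_truncated_lstsq(A, b, k):
    """x_k = A_k^† b: apply the pseudoinverse of the best rank-k
    approximation of A, using only the top-k singular triplets."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    # Project b onto the top-k left singular vectors, divide by the
    # corresponding singular values, and map back via V.
    return Vt[:k].T @ ((U[:, :k].T @ b) / s[:k])

rng = np.random.default_rng(1)
A = rng.standard_normal((100, 20))   # assumed m x n sizes for illustration
b = rng.standard_normal(100)
x5 = svd_truncated_lstsq(A, b, k=5)
```

The full SVD here costs the O(mn·min{m, n}) time mentioned above; the paper's contribution is avoiding that cost.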


Large-scale Inversion of Magnetic Data Using Golub-Kahan Bidiagonalization with Truncated Generalized Cross Validation for Regularization Parameter Estimation

In this paper, a fast method for large-scale sparse inversion of magnetic data is considered. The L1-norm stabilizer is used to generate models with sharp and distinct interfaces. To deal with the non-linearity introduced by the L1-norm, a model-space iteratively reweighted least squares algorithm is used. The original model matrix is factorized using the Golub-Kahan bidiagonalization that proje...
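A generic model-space iteratively reweighted least squares (IRLS) step for an L1-stabilized least-squares problem can be sketched as follows. This is an illustrative toy version with assumed regularization weight `lam` and smoothing floor `eps`; the paper's actual implementation uses Golub-Kahan bidiagonalization and GCV-based parameter selection, which are not reproduced here:

```python
import numpy as np

def irls_l1(G, d, lam=0.1, n_iter=50, eps=1e-6):
    """Approximately minimize ||G m - d||_2^2 + lam * ||m||_1 by IRLS.
    lam, n_iter, and eps are illustrative assumptions, not the paper's values."""
    m = np.linalg.lstsq(G, d, rcond=None)[0]       # unregularized start
    for _ in range(n_iter):
        # The L1 penalty is replaced by a weighted L2 penalty with
        # weights 1/(|m| + eps), refreshed from the current iterate.
        W = np.diag(1.0 / (np.abs(m) + eps))
        m = np.linalg.solve(G.T @ G + lam * W, G.T @ d)
    return m

# Toy sparse recovery problem (assumed sizes and support).
rng = np.random.default_rng(2)
G = rng.standard_normal((60, 30))
m_true = np.zeros(30)
m_true[[3, 12, 25]] = [2.0, -1.5, 1.0]
d = G @ m_true
m_rec = irls_l1(G, d)
```

The reweighting drives entries near zero toward zero while leaving large entries almost unpenalized, which is what produces the sharp, blocky models mentioned in the abstract.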


Face Recognition Based Rank Reduction SVD Approach

Standard face recognition algorithms that use standard feature extraction techniques often suffer from degraded performance on images. Recently, singular value decomposition and low-rank matrix approximation have been applied in many applications, including pattern recognition and feature extraction. The main objective of this research is to design an efficient face recognition approach by combining many tech...


Introducing a New Lifetime Distribution of Power Series Distribution of the Family Gompertz

In this paper, we propose a new three-parameter lifetime distribution from the power series family of Gompertz distributions, with decreasing, increasing, increasing-decreasing and unimodal failure rate shapes. The distribution is a compound version of the Gompertz and zero-truncated Poisson distributions, called the Gompertz-Poisson distribution (GPD). The density function, the hazard rate function, a gener...





Publication date: 2017